Improved AdaNet based on adaptive learning rate optimization
LIU Ran, LIU Yu, GU Jinguang
Journal of Computer Applications    2020, 40 (10): 2804-2810.   DOI: 10.11772/j.issn.1001-9081.2020020237
AdaNet (Adaptive structural learning of artificial neural Networks) is a neural architecture search framework based on Boosting ensemble learning that builds high-quality models by ensembling subnetworks. However, the subnetworks generated by the existing AdaNet differ little from one another, which limits how much ensemble learning can reduce the generalization error. In the two steps of AdaNet, setting the subnetwork weights and ensembling the subnetworks, adaptive learning rate methods such as Adagrad, RMSProp (Root Mean Square Prop), Adam and RAdam (Rectified Adam) were used to improve the existing optimization algorithms in AdaNet. The improved optimization algorithms scale the learning rate differently for each parameter dimension, producing a more dispersed weight distribution; this increases the diversity of the subnetworks generated by AdaNet and thereby reduces the generalization error of the ensemble. Experimental results on three datasets, MNIST (Mixed National Institute of Standards and Technology database), Fashion-MNIST, and Fashion-MNIST with Gaussian noise, show that the improved optimization algorithms speed up the architecture search of AdaNet, and that the more diverse subnetworks they generate improve the performance of the ensemble model. In terms of F1 score, the metric used to evaluate model performance, the improved methods achieve maximum gains of 0.28%, 1.05% and 1.10% over the original method on the three datasets respectively.
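As a rough illustration of the per-dimension learning rate scaling mentioned above (a minimal NumPy sketch, not the authors' code; the function name adam_step and the toy gradients are hypothetical), the following shows how an Adam-style update normalizes each parameter dimension by its own gradient history, so dimensions with very different raw gradient magnitudes take comparable steps:

    import numpy as np

    def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
        """One Adam update: each parameter dimension gets its own effective step size."""
        m = b1 * m + (1 - b1) * grad              # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * grad ** 2         # second-moment (uncentred variance) estimate
        m_hat = m / (1 - b1 ** t)                 # bias correction
        v_hat = v / (1 - b2 ** t)
        theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)   # per-dimension scaling
        return theta, m, v

    # Toy usage: two parameters whose gradients differ by two orders of magnitude.
    theta, m, v = np.array([1.0, 1.0]), np.zeros(2), np.zeros(2)
    for t in range(1, 6):
        grad = np.array([10.0, 0.1])              # dimension 0 gets a much larger raw gradient
        theta, m, v = adam_step(theta, grad, m, v, t)
    print(theta)                                  # both dimensions move by comparable amounts

This per-dimension normalization is what lets subnetworks trained under different adaptive optimizers end up with more dispersed weights than plain SGD would produce.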
Evolution relationship extraction of emergency based on attention-based bidirectional long short-term memory network model
WEN Chang, LIU Yu, GU Jinguang
Journal of Computer Applications    2019, 39 (6): 1646-1651.   DOI: 10.11772/j.issn.1001-9081.2018122533
Existing studies of relationship extraction for emergencies focus mostly on causality and neglect other evolution relationships. To improve the completeness of the information extracted for emergency decision-making, an attention-based bidirectional Long Short-Term Memory (LSTM) model was used to extract evolution relationships. Firstly, combining the concept of evolution relationships in emergencies, an evolution relationship model was constructed and formally defined, and the emergency corpus was annotated according to this model. Then, a bidirectional LSTM network was built and an attention mechanism was introduced to compute attention probabilities that highlight the key words in the text. Finally, the trained network model was used to extract the evolution relationships. In the extraction experiments, compared with existing causality extraction methods, the proposed method extracts richer evolution relationships for emergency decision-making; at the same time, its average precision, recall and F1 score increase by 7.3%, 6.7% and 7.0% respectively, effectively improving the accuracy of evolution relationship extraction for emergencies.
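As a minimal sketch of the kind of attention-based bidirectional LSTM described above (assuming a PyTorch implementation; the class name AttBiLSTM, the layer sizes and the number of relation classes are illustrative and not taken from the paper), the following pools the BiLSTM hidden states with softmax attention weights before classifying the relation:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AttBiLSTM(nn.Module):
        def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_classes=5):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
            self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
            self.att = nn.Linear(2 * hidden_dim, 1)        # scores each time step
            self.fc = nn.Linear(2 * hidden_dim, num_classes)

        def forward(self, tokens):                         # tokens: (batch, seq_len)
            h, _ = self.bilstm(self.embed(tokens))         # (batch, seq_len, 2*hidden)
            weights = F.softmax(self.att(h).squeeze(-1), dim=-1)      # attention probabilities
            context = torch.bmm(weights.unsqueeze(1), h).squeeze(1)   # weighted sum of states
            return self.fc(context)                        # relation-class logits

    # Toy usage: classify 4 token sequences of length 30.
    model = AttBiLSTM(vocab_size=20000)
    logits = model(torch.randint(1, 20000, (4, 30)))

The attention weights play the role described in the abstract: they emphasize the key words of the sentence before the pooled representation is passed to the relation classifier.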